BMC Nephrology
Springer Science and Business Media LLC
Preprints posted in the last 30 days, ranked by how well they match BMC Nephrology's content profile, based on 13 papers previously published here. The average preprint has a 0.02% match score for this journal, so anything above that is already an above-average fit.
Gollie, J.; Ryan, A. S.; Harris-Love, M. O.; Kokkinos, P.; Scholten, J.; Pugh, R. J.; Hazel, C. G.; Blackman, M. R.
Physical inactivity is common in chronic kidney disease (CKD) and is associated with poor neuromuscular and functional outcomes. Whether habitual physical activity (PA) influences adaptations to structured exercise in CKD remains unclear. This study examined whether adaptations to combined flywheel resistance and aerobic exercise (FRE+AE) differed by self-reported PA in Veterans with CKD stages 3 and 4. Twenty older male Veterans with CKD stages 3-4 (mean eGFR 37.9 ± 10.2 mL/min/1.73 m2) were randomized to six weeks of FRE+AE (n=11) or health education (EDU; n=9). Participants were classified as meeting (Meets PA) or falling below (Low PA) weekly moderate-intensity PA recommendations using the 7-Day Physical Activity Recall. Outcomes included vastus lateralis muscle thickness (VL MT), knee extensor power output (at 60°/s and 180°/s), gait speed (GS), and five-repetition sit-to-stand (STS) time. FRE+AE increased VL MT (p=0.030), power output at 180°/s (p=0.021), and GS (p=0.001), and reduced STS time (p=0.012), with significant between-group differences versus EDU for VL MT (p=0.009) and GS (p=0.028). After FRE+AE, the Low PA group showed greater increases in power output at 60°/s (Hedges' g: Low PA=0.44, Meets PA=0.25) and 180°/s (Hedges' g: Low PA=1.38, Meets PA=0.38) than the Meets PA group. Conversely, the Meets PA group showed greater improvements in GS (Hedges' g: Low PA=0.93, Meets PA=1.29) and STS (Hedges' g: Low PA=-0.72, Meets PA=-2.20). Six weeks of FRE+AE produced clinically meaningful neuromuscular and functional improvements in Veterans with CKD stages 3 and 4 irrespective of PA level, supporting FRE+AE as a feasible intervention in this population.
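The subgroup contrasts in this abstract are reported as Hedges' g, a standardized mean difference with a small-sample bias correction. A minimal sketch of that computation (the numbers below are illustrative, not the study's data):

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Hedges' g: Cohen's d scaled by a small-sample bias correction."""
    # Pooled standard deviation across the two groups
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / s_pooled
    # Approximate bias-correction factor J for small samples
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return d * j

# Hypothetical group summaries (not from the study), sized like its arms
print(round(hedges_g(12.0, 3.0, 11, 10.0, 3.0, 9), 2))
```

With groups of roughly this size, the correction factor shrinks the raw standardized difference by a few percent, which is why g is preferred over Cohen's d in trials this small.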
Melville, S.; MacKinnon, M.; Michaud, J.
Background: Life-sustaining hemodialysis (HD) is onerous for patients, especially those with multiple comorbidities and advanced age. A standard HD prescription is 720 minutes per week. Alternative HD regimens have been proposed in an attempt to maintain quality of life (QOL). Studies are needed to investigate the efficacy and safety of less frequent HD prescriptions in this population. This is an institution-wide observational study in New Brunswick, Canada comparing HD prescriptions and their impact on QOL and mortality. Objective: To assess current HD prescribing practices at a provincial healthcare institution in relation to patient QOL. Design: Prospective observational study. Setting: Single-centre hospital and satellite hemodialysis units. Patients: Voluntarily consented patients undergoing in-centre hemodialysis treatment. Measurements: Observational clinical data were collected for each study participant from their hospital and dialysis electronic medical records. The KDQOL-36 questionnaire was used to assess patient-reported quality of life at the time of consent. Methods: Adults undergoing in-centre or satellite-site HD for at least 3 months were eligible to participate. Consenting participants were grouped by HD prescription: 720 minutes or more per week, or less than 720 minutes per week. All participants completed the KDQOL-36 questionnaire to estimate QOL, and groups were compared using the Mann-Whitney U test. Emergency department visits, hospitalizations, and mortality were analyzed using negative binomial or logistic regression. Results: We enrolled 140 participants; 41 were undergoing less than 720 minutes per week of HD and 99 were undergoing 720 minutes or more. Patients undergoing less than 720 minutes per week were older [median (IQR): 76 (72-81) yrs. vs. 64 (55-75) yrs.; p < 0.001], had higher median (IQR) QOL scores on the Symptoms/Problems List scale of the KDQOL-36 [79.2 (70.8-88.5) vs. 70.8 (62.5-81.3); p = 0.0022], and were less likely to present to the emergency department (incidence rate ratio 0.52, 95% confidence interval [CI] 0.33-0.81). Mortality was similar between groups, even when adjusted for age and comorbidity score (odds ratio 1.62, 95% CI 0.59-4.49). Limitations: Participant enrollment was limited by the single-centre nature of this study. As this was an observational study, we did not account for how long patients had been prescribed less than 720 minutes of hemodialysis. We did not include a frailty assessment of the study participants. A larger sample may have identified significant trends in mortality. Conclusions: Patients undergoing less than 720 minutes of weekly HD had a higher QOL score on the KDQOL-36 Symptoms/Problems List scale, presented less frequently to the emergency department, and were not more likely to die than patients undergoing 720 minutes or more of weekly HD. Further studies are required to assess the feasibility and safety of a conservative model of HD prescribing to improve QOL of patients with palliative care treatment goals.
Zhang, X.; Ping, Z.
Background: The association between physical activity (PA) and chronic kidney disease (CKD) has been explored; however, it remained unknown whether PA affects the incidence of CKD through an inflammation pathway. Moreover, objective PA measured by accelerometers has rarely been considered for this association. Methods: This study was performed in a large-scale prospective cohort with two sub-cohorts: the International Physical Activity Questionnaire (IPAQ)-measured cohort (N=314,694) and the device-measured cohort (N=79,454). Cox models were used to assess the association of PA with incident CKD. The mediating role of inflammation in this association was investigated using four inflammation metrics: C-reactive protein (CRP), white blood cell count (WBC), a low-grade inflammation (INFLA) score, and the monocyte to high-density lipoprotein cholesterol ratio (MHR). Results: In the questionnaire-measured cohort, compared to low PA, moderate and high PA reduced the risk of CKD by approximately 28.0% (95% CI 24.4-31.5%) and 37.6% (34.4-40.7%), respectively. Inflammation significantly mediated this association, with mediation proportions of 4.1% (3.0-5.1%), 1.4% (1.1-1.7%), 9.8% (7.7-11.9%), and 1.4% (1.1-1.7%) for CRP, WBC, INFLA score, and MHR, respectively. Evidence from the device-measured cohort further strengthened the robustness of our findings, though the effects were somewhat attenuated, with mediation proportions of 2.2% (1.2-3.2%), 0.8% (0.2-1.3%), 4.3% (2.5-6.0%), and 1.3% (0.6-2.1%) for CRP, WBC, INFLA score, and MHR, respectively. Conclusions: Our study provides suggestive evidence for an association of active PA with reduced CKD risk and demonstrates a mediating role of inflammation in this association, providing a novel perspective for the early prevention of CKD.
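The mediation proportions in this abstract express how much of the total PA-CKD effect runs through each inflammation marker. The abstract does not state the estimator used, but a common approximation for survival outcomes is the difference method on the log hazard-ratio scale; a minimal sketch with hypothetical hazard ratios (not the study's estimates):

```python
import math

def proportion_mediated(hr_total, hr_direct):
    """Difference-method mediation proportion on the log-HR scale:
    PM = (log HR_total - log HR_direct) / log HR_total,
    where HR_direct is the exposure effect after adjusting for the mediator."""
    return (math.log(hr_total) - math.log(hr_direct)) / math.log(hr_total)

# Hypothetical example: total effect HR 0.70; direct effect HR 0.72
# after additionally adjusting for an inflammation marker such as CRP
print(round(proportion_mediated(0.70, 0.72), 3))
```

The attenuation of the hazard ratio toward 1 after mediator adjustment is what counts as "mediated"; small attenuations yield the single-digit percentages reported above.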
Saxena, J. N.; Potturu, D. V. P.; Nagraj, A.
Background: Chronic kidney disease (CKD) affects approximately 850 million individuals worldwide and remains a leading cause of morbidity, premature mortality, and escalating healthcare costs. Despite the availability of clinical biomarkers, CKD progression to end-stage renal disease (ESRD) is frequently identified late, limiting opportunities for preventive intervention. Conventional predictive models have relied predominantly on static, cross-sectional laboratory values, failing to capture the temporal dynamics of disease trajectory that longitudinal claims data can provide. Objective: This study proposes a novel hybrid machine learning framework, XGBoost-LSTM-Attention (XLA), that integrates gradient-boosted feature selection with long short-term memory (LSTM) networks and a temporal attention mechanism to improve early prediction of CKD progression from Stage 3 to Stages 4/5 or ESRD using longitudinal claims-based features. Methods: We conducted two complementary analyses. Primary analysis: a cross-sectional validation using real NHANES 2015-2018 data (n=701 CKD Stage 3 adults) predicting significant proteinuria (UACR ≥30 mg/g) from clinical features excluding UACR. Supplementary analysis: an NHANES-calibrated longitudinal cohort (n=8,412) with simulated quarterly measurements demonstrated XLA performance under real-world longitudinal data conditions. All models were evaluated using 5-fold stratified cross-validation. Results: In the primary NHANES cross-sectional analysis, the XLA framework achieved an AUC-ROC of 0.684 (95% CI: 0.641 to 0.727), with all models performing comparably (AUC 0.684 to 0.710), confirming that cross-sectional clinical features alone provide limited signal for proteinuria prediction and underscoring the necessity of UACR measurement. In the longitudinal supplementary analysis, XLA achieved an AUC-ROC of 0.994 versus 0.939 for the best cross-sectional baseline (+5.5%), demonstrating that temporal trajectory features, particularly eGFR slope and RAAS adherence trends, confer substantial incremental predictive value when longitudinal data are available. Conclusion: The XLA framework demonstrates meaningful advantages over traditional approaches when applied to longitudinal claims data. Cross-sectional findings highlight the irreplaceable role of direct UACR measurement in CKD risk stratification. Together, these results provide actionable evidence for both the limitations of static prediction and the promise of trajectory-based approaches in value-based care programs managing large CKD populations. Keywords: chronic kidney disease, CKD progression, machine learning, XGBoost, LSTM, temporal attention, claims data, NHANES, proteinuria, healthcare informatics, value-based care.
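The headline metric in both analyses, AUC-ROC, has a simple rank interpretation: it is the probability that a randomly chosen progressor receives a higher risk score than a randomly chosen non-progressor (the Mann-Whitney identity). A minimal pure-Python sketch with toy scores (illustrative only, not the study's data):

```python
def auc_from_scores(scores, labels):
    """ROC AUC via the rank-sum identity: P(score_pos > score_neg),
    counting ties between a positive and a negative as 1/2."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy risk scores: label 1 marks patients who progressed
print(auc_from_scores([0.9, 0.4, 0.5, 0.2], [1, 1, 0, 0]))
```

An AUC of 0.5 corresponds to random ranking and 1.0 to perfect separation, which is why the jump from 0.684 (cross-sectional) to 0.994 (longitudinal) in the abstract represents such a large gain in discrimination.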
Schobert, M.; Boehm, S.; Borisov, O.; Li, Y.; Greve, G.; Edemir, B.; Woodward, O. M.; Jung, H. J.; Koettgen, M. M.; Westermann, L.; Schlosser, P.; Hutter, F.; Kottgen, A.; Haug, S.
Background: Kidney cell lines are widely used to model kidney physiology and disease; however, their gene expression profiles may differ from primary cells due to immortalization, culture conditions, or experimental treatments. Determining whether a cell line resembles its native cell type is critical for interpreting in vitro findings. We developed a transcriptome-based approach that matches bulk RNA-seq data from kidney cell lines, primary cells, or tissues to reference cell types derived from single-cell RNA-seq (scRNA-seq) datasets. Methods: Reference transcriptomic profiles were generated from two human and two murine kidney scRNA-seq datasets by pseudobulk aggregation. Bulk RNA-seq data from microdissected kidney tissue, non-kidney negative controls, and kidney cell lines were matched to these references using three statistical similarity measures (Spearman correlation, Euclidean distance, Poisson distance) and three machine learning classifiers (Random Forest, XGBoost, TabPFN). Each was assessed with global gene expression, curated kidney marker gene lists, and the most variable genes. Matching accuracy was evaluated through a three-step validation strategy: within-dataset matching, cross-reference comparison, and validation against primary kidney tissue and negative controls. Results: Rank-based Spearman correlation and TabPFN, a foundation model for tabular data, emerged as the most accurate and specific approaches, particularly with curated kidney marker gene lists. Both methods correctly identified microdissected kidney tubule segments and were robust against non-kidney negative controls. Applied to commonly used kidney cell lines, OK cells retained proximal tubule identity, particularly under shear stress, while other proximal tubule lines (HK-2, HKC-8, HKC-11) showed inconsistent matching. Collecting duct-derived mIMCD-3 cells maintained stable similarity across passages, culture conditions, and genetic modifications. Conclusion: We provide two complementary implementations: CellMatchR, an accessible web-based tool using Spearman correlation for routine use, and comprehensive scripts for TabPFN-based matching (link will be added after peer-reviewed publication). Together, these resources enable researchers to make informed decisions about kidney cell culture model selection, interpretation, and stability. Translational Statement: Kidney cell lines are fundamental tools in nephrology research, yet their transcriptomic similarity to native cell types is rarely validated systematically. We demonstrate that combining bulk RNA-seq data with single-cell reference datasets enables robust assessment of cell line identity using rank-based correlation and machine learning approaches. By providing a comprehensive evaluation of matching methods, curated kidney marker gene lists, and reference datasets, our study serves as both a practical resource and a methodological framework for the kidney research community, facilitating informed selection of cell culture models, quality control of experimental conditions, development of new experimental cell culture models, and more reliable translation of in vitro findings to kidney physiology and disease.
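Spearman matching of the kind described here reduces to computing a rank correlation between a sample's expression profile and each pseudobulk reference, then assigning the best-scoring cell type. A minimal pure-Python sketch of the correlation step itself, with toy expression values (not a real marker panel or the authors' implementation):

```python
def rank(values):
    """Average 1-based ranks; tied values share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean rank of the tied run, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman correlation: Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy marker-gene expression: cell line vs. one pseudobulk reference
print(round(spearman([5.0, 1.0, 3.0, 2.0], [10.0, 2.0, 7.0, 4.0]), 2))
```

Because Spearman works on ranks rather than raw counts, it is robust to the monotone scale differences (library size, normalization) that typically separate bulk and pseudobulk profiles, which is likely why it performed well here.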
Ren, Y.; Shafi, T.; Segal, M. R.; Li, H.; Pico, A. R.; Shin, M.-G.; Schelling, J. R.; Hulleman, J. D.; He, J.; Li, C.; Choles, H. R.; Brown, J.; Dobre, M. A.; Mehta, R.; Deo, R.; Srivastava, A.; Taliercio, J.; Sozio, S. M.; Jaar, B.; Estrella, M. M.; Chen, W.; Chertow, G. M.; Parekh, R.; Ganz, P.; Dubin, R.; CRIC Study Investigators
Background: Patients with kidney failure undergoing maintenance hemodialysis suffer high rates of major adverse cardiovascular events (MACE) that are not accurately predicted by traditional cardiovascular risk models. There is an urgent need to identify novel, modifiable cardiovascular risk factors for these patients. Methods: We analyzed associations of 6,287 circulating proteins with MACE among 1,048 participants undergoing hemodialysis in the Chronic Renal Insufficiency Cohort (CRIC) study (14-year follow-up), with validation in the Predictors of Arrhythmic and Cardiovascular Risk in End-Stage Renal Disease (PACE) study (7-year follow-up). In both cohorts, proteins were measured shortly after dialysis initiation and one year later. We compared protein-based risk models derived by elastic net regression to the Pooled Cohort Equations (PCE) optimized for these cohorts (Refit PCE), and to an Expanded Refit PCE that included troponin T and N-terminal pro-B-type natriuretic peptide. Results: In CRIC, 149 proteins were associated with MACE at a false discovery rate <0.05. Among 22 proteins significant at Bonferroni p<8×10^-6, proteins that validated in PACE included Sushi, von Willebrand factor type A, EGF and pentraxin domain-containing protein 1 (SVEP1), complement component C7, R-spondin 4, tenascin, fibulin-3, and fibulin-5. Complement pathways were prominent in network analyses. SVEP1 surpassed the other markers in statistical significance, with a CRIC HR per log2 increment of 1.8 (p=2.1×10^-12) and an HR per annual doubling of 1.6 (p=6.8×10^-6). For 2-year MACE, the AUC (95% CI) for SVEP1 alone was 0.72 (0.59, 0.84) in CRIC and 0.73 (0.63, 0.81) in PACE. SVEP1 surpassed the Expanded Refit PCE in CRIC (0.61 (0.48, 0.73)) (p=0.038). In the pooled CRIC + PACE cohort, the SVEP1 AUC (95% CI) (0.79 (0.70, 0.88)) surpassed the Refit PCE (0.61 (0.51, 0.72)) (p=0.004). Conclusions: SVEP1, a 390 kDa protein unlikely to be renally cleared, surpassed over 6,000 other proteins and by itself outperformed traditional clinical risk models in predicting MACE in two populations of patients undergoing maintenance hemodialysis. Future studies should explore the mechanisms behind these findings.
Ahmadi, A.; Rahaman, M.; Harsh, A.; Yang, J.; Ghanim, B.; Dasgupta, S.; Weinreb, R. N.; Rahman, T.; Houben, A. J. H. M.; Ix, J. H.; Malhotra, R.
Background: Microvascular dysfunction contributes to chronic kidney disease (CKD), but reproducible clinical measures are limited. Laser Doppler flowmetry (LDF) provides a noninvasive assessment of cutaneous microvascular blood flow and may reflect systemic microvascular health. Its relationship with kidney function and histopathology in CKD remains unclear. Methods: We assessed cutaneous microvascular function in 150 participants with CKD (eGFR <90 mL/min/1.73 m2) using a standardized forearm LDF protocol. Baseline perfusion was recorded at ~30°C, followed by local heating to 44°C to induce hyperemia. Percent change in perfusion units (PU) defined microvascular functional reserve. Associations of LDF measures with eGFR and urine protein-to-creatinine ratio (uPCR) were evaluated using multivariable linear regression. K-means clustering identified microvascular phenotypes. In a subset (n=20), associations with glomerulosclerosis (GS) and interstitial fibrosis/tubular atrophy (IFTA) were examined. Results: The mean (SD) age was 64 (14) years, and 46% of participants were female. The mean eGFR was 42 (21) mL/min/1.73 m2 and the median uPCR was 0.21 (interquartile range (IQR) 0.11 to 1.20) mg/mg. Higher baseline PU (β = -12; 95% CI, -24 to -1) and reduced percent change in PU (β = 7; 95% CI, 2 to 13) were associated with lower eGFR, independent of covariates. Neither measure was associated with uPCR. Clustering identified four phenotypes with graded differences in perfusion and reserve. In biopsy participants, higher baseline PU and lower percent change were associated with greater GS and IFTA severity. Conclusion: CKD is characterized by elevated resting perfusion and impaired microvascular reserve, which are associated with lower eGFR and histopathologic injury.
Miura, A.; Okabe, M.; Okabayashi, Y.; Sasaki, T.; Haruhara, K.; Tsuboi, N.; Yokoo, T.
Background: Single-nephron glomerular filtration rate (GFR) is a nephron-level functional index that may reveal key pathophysiological mechanisms driving progression in patients with diabetic nephropathy. However, its clinical relevance remains incompletely understood. This cross-sectional study assessed single-nephron estimated GFR (eGFR) across chronic kidney disease (CKD) stages in patients with advanced diabetic nephropathy. Methods: Nephron number was estimated as the number of nonglobally sclerotic glomeruli per kidney using computed tomography-derived cortical volume combined with biopsy stereology. Single-nephron eGFR was calculated by dividing eGFR by the nephron number of both kidneys. Patients were stratified according to CKD stage at kidney biopsy. Associations between CKD stage and single-nephron eGFR were evaluated using multivariable linear regression models adjusted for age, sex, urinary protein excretion, and eGFR. Results: The study included 105 patients with biopsy-proven diabetic nephropathy and overt proteinuria (median age 59 years, 83% male, HbA1c 6.6%; 57% had nephrotic-range proteinuria). The percentage of globally sclerotic glomeruli, mesangial expansion score, and prevalence of nodular lesions increased significantly with advancing CKD stage. Median nephron number declined from 529,178 to 224,458 per kidney, whereas glomerular volume remained constant. Single-nephron eGFR decreased markedly with CKD stage and remained significantly inversely associated with CKD stage after adjustment for clinicopathologic covariates (P for trend <0.001). Conclusion: In overt diabetic nephropathy, single-nephron eGFR decreased with advancing CKD stage, despite relatively preserved glomerular volume. At this stage of disease, structural alterations specific to diabetic nephropathy may impair effective single-nephron filtration capacity.
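The core calculation here, dividing whole-kidney eGFR by the estimated nephron number of both kidneys, is simple arithmetic; a sketch with hypothetical values (not the study's data; the study's eGFR is indexed to 1.73 m2 of body surface area):

```python
def single_nephron_egfr(egfr_ml_min, nephrons_per_kidney):
    """Single-nephron eGFR in nL/min: whole-kidney eGFR divided by the
    estimated number of functioning (nonsclerotic) nephrons in both kidneys."""
    total_nephrons = 2 * nephrons_per_kidney
    # Convert mL/min to nL/min (1 mL = 1e6 nL) before dividing
    return egfr_ml_min * 1e6 / total_nephrons

# Hypothetical patient: eGFR 50 mL/min, 500,000 nonsclerotic nephrons per kidney
print(round(single_nephron_egfr(50, 500_000), 1))
```

The denominator is what makes the measure informative: if eGFR and nephron number fall in proportion, single-nephron eGFR is constant, so the decline reported across CKD stages implies filtration is falling faster than nephrons are being lost.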
Singh, S.; Patel, S. K.; Matsuura, R.; Velazquez, D.; Sun, Z.; Noel, S.; Rabb, H.; Fan, J.
Background: Kidney transplantation is the preferred treatment strategy for end-stage kidney disease. Deceased donor kidneys usually undergo cold storage until transplantation, leading to cold ischemia injury that may contribute to poor graft outcomes. However, the molecular characterization of the potential mechanisms of cold ischemia injury remains incomplete. Results: To bridge this knowledge gap, we leveraged 10x Visium spatial transcriptomic technology to perform full-transcriptome profiling of murine kidneys subjected to varying durations of cold ischemia typical of a deceased donor kidney transplant setting. We developed a computational workflow to identify and compare spatiotemporal transcriptomic changes that accompany the injury pathophysiology in a tissue compartment-specific manner. We identified proportional enrichment of oxidative phosphorylation (OXPHOS) genes with increasing duration of cold ischemia injury within the oxygen-lean inner medulla, suggestive of an atypical metabolic presentation. This pattern was distinct from that observed in warm ischemia-reperfusion kidney injury tissue. Spatiotemporal trends were validated by qPCR and immunofluorescence in a larger cohort of mice. We provide an interactive online browser at https://jef.works/CellCarto-ColdIschemia/ to facilitate exploration of our results by the broader scientific and clinical community. Conclusions: Altogether, our spatiotemporal transcriptomic analysis identified coordinated molecular changes within metabolic pathways such as OXPHOS deep within the cold ischemic kidney, highlighting the need for increased attention to the inner medulla and potential opportunities for new insights beyond those available from superficial biopsy-focused tissue examinations.
Nishida, T.; Hanamura, I.; Honda, S.; Honda, A.
Objectives: Cardiovascular disease (CVD) is a leading cause of mortality and disability in older populations. This study aimed to identify CVD risk factors in community-dwelling older adults and to examine whether frailty-related factors (sarcopenia and nutritional status) interact with chronic kidney disease (CKD). Methods: This cross-sectional study included 307 community-dwelling Japanese adults aged ≥65 years, assessed between September 2024 and March 2025. CVD history was assessed based on self-reported physician diagnoses obtained through a structured questionnaire. Lifestyle-related factors included hypertension, diabetes, dyslipidemia, and body mass index (BMI). Frailty-related factors included sarcopenia (Asian Working Group for Sarcopenia 2019 criteria), nutritional status (Mini Nutritional Assessment-Short Form), and physical activity (International Physical Activity Questionnaire-Short Form). CKD was defined using the estimated glomerular filtration rate (eGFR): non-CKD (≥60 mL/min/1.73 m2) and CKD (<60 mL/min/1.73 m2). Multivariable logistic regression identified independent correlates of CVD, and interactions between CKD and frailty-related factors were tested. Results: The prevalence of CVD was 17.9%. Independent correlates included CKD (aOR 5.0), hypertension (aOR 4.0), male sex (aOR 3.1), undernutrition (aOR 2.7), sarcopenia (aOR 2.7), and low physical activity (aOR 2.5). No significant interactions were observed between CKD and sarcopenia (p = 0.70) or nutritional status (p = 0.40). Conclusions: CKD, sarcopenia, undernutrition, and low physical activity were independently associated with CVD, with no interaction between CKD and frailty factors. These findings suggest that integrated management addressing both renal function and frailty-related factors may be important for CVD prevention in older adults.
Huang, L.; Xu, X.; Matsushita, K.; Brady, T. M.; Appel, L. J.; Hoorn, E. J.; Tian, M.; Aminde, L. N.; Trieu, K.; Neal, B.; Marklund, M.
Objective: To estimate the benefits and risks of replacing regular salt with potassium-enriched salt. Design: Comparative risk assessment modelling. Setting: Worldwide. Participants: Adult populations aged 25 and above. Interventions: (1) worldwide replacement of all salt (discretionary salt used for seasoning or cooking in the home, and non-discretionary salt used in processed and restaurant foods); (2) worldwide replacement of just discretionary salt; (3) worldwide replacement of just non-discretionary salt; (4) replacement of discretionary salt only for people with diagnosed hypertension; and (5) replacement of discretionary salt only for people with treated hypertension. Main outcome measures: For scenarios 1-3, we estimated benefits, including deaths, new cases, and disability-adjusted life years (DALYs) from cardiovascular disease (CVD) and chronic kidney disease (CKD) averted through blood pressure lowering, as well as harms (CVD deaths) caused by hyperkalaemia among people with CKD stages G3-G5. Results: Replacement of all salt worldwide could prevent 2.96 (95% uncertainty interval 2.81-3.12) million deaths, 10.17 (9.59-10.70) million new cases of disease, and 69.43 (65.61-72.92) million DALYs each year. These figures represent 14.6%, 13.1%, and 16.5% of the annual global disease burden attributable to CVD and CKD. Replacement of all discretionary salt (1.85, 1.74-1.97 million deaths) would have a greater impact on mortality than replacement of all non-discretionary salt (1.56, 1.46-1.67 million deaths). In people with CKD stages G3-G5, there would be a net benefit: replacement of all salt would prevent 0.75 (0.71-0.80) million deaths but might cause 0.10 (0.09-0.11) million deaths from hyperkalaemia. Discretionary salt replacement only among diagnosed or treated hypertensives would prevent 0.59 (0.55-0.63) million and 0.48 (0.45-0.52) million deaths, respectively.
Conclusion Switching regular salt to potassium-enriched salt appears to offer large potential for health gains under diverse scenarios, including for people with CKD.
Gittus, M.; Pitcher, D.; O'Cathain, A.; Ong, A. C. M.; Simms, R.; Fotheringham, J. B.
Background and hypothesis: Autosomal dominant polycystic kidney disease (ADPKD) affects over 12 million people worldwide, including an estimated 30,000-70,000 in the United Kingdom (UK). Tolvaptan is the only disease-modifying therapy approved for rapidly progressing disease. Despite national guidance, prescribing rates were hypothesised to vary by kidney centre. Treatment may not always align with guidelines: some patients eligible for tolvaptan may not be initiated, while other patients initiated on tolvaptan may not meet eligibility criteria. This may have important consequences for healthcare costs and health-related quality of life. Methods: The National Registry of Rare Kidney Diseases (RaDaR) collects longitudinal data from UK NHS kidney centres. This retrospective cohort study used routinely collected data (2016-2023) to examine tolvaptan prescribing across kidney centres. Kidney centre-level initiation patterns were described, assessed using mixed-effects logistic regression, and visualised with funnel plots. Cost-effectiveness analyses combined observed prescribing practices with likely negotiated commercial discounts to estimate the costs and quality-adjusted life year (QALY) consequences of prescribing at the national level. Results: Our study included 3,609 people with ADPKD from 72 kidney centres. Patients eligible for tolvaptan who were not initiated accounted for 34.8% (292/839). Across centres, five (6.9%) initiated tolvaptan significantly more often than expected among eligible participants, while one centre (1.4%) initiated significantly less often. Nationally, this could result in up to £53.7 million in lost savings (assuming a 60% medication price reduction) and up to 1,245 lost QALYs. Patients initiated on tolvaptan who were not eligible accounted for 26.1% (103/395). Only one centre had significantly fewer eligible patients than expected among initiated patients. Nationally, this could cost up to £15.9 million (assuming a 60% medication price reduction). Conclusions: There is evidence of variation in tolvaptan prescribing in the UK. A substantial proportion of patients eligible for tolvaptan were not initiated at the cohort level, with evidence of variation between centres suggesting differences in treatment decision-making. A substantial proportion of patients initiated on tolvaptan were not eligible at the cohort level, but there was limited evidence of variation between centres. Together, these findings raise questions regarding the consistency of clinical decision-making, equitable access to a sole disease-modifying therapy in a rare disease, alignment with national guidance, and effective use of healthcare resources.
Ebbestad, R.; Fatehi, A.; Olauson, H.; Bozek, K.; Butt, L.; Benzing, T.; Blom, H.; Brismar, H.; Lundberg, S.; Unnersjö-Jess, D.
Introduction: Podocyte injury is central to the pathogenesis of most glomerulonephritides (GN) and causes segmental glomerulosclerotic lesions that predict progression in IgA nephropathy (IgAN). Recent advances in high-resolution microscopy and AI-assisted image analysis have enabled detailed quantification of podocyte foot process (FP) morphology. However, whether nanoscale podocyte morphometrics can predict disease progression or treatment response in GN has not been investigated. Aim: To evaluate whether nanoscale podocyte morphometric parameters predict clinical characteristics, disease progression, and treatment response in GN, with a focus on IgAN. Methods: Podocyte morphometrics were analyzed in kidney biopsies from patients with GN using high-resolution microscopy and the deep learning-based tool Automatic Morphometric Analysis of Podocytes (AMAP). Four morphometric parameters were quantified: slit diaphragm length (SDL), FP area, FP circularity, and FP perimeter. These parameters were correlated with clinical characteristics, conventional electron microscopy (EM) findings, and longitudinal follow-up data. Results: The study included 37 patients with GN from Danderyd University Hospital (Stockholm, Sweden), with IgAN representing the largest diagnostic subgroup (n = 19). The median follow-up for the cohort was 3.0 years. SDL correlated significantly with urine albumin-to-creatinine ratio (uACR; p = 0.021), whereas conventional EM measurements did not (p = 0.22). Within the IgAN subgroup, lower SDL was associated with a steeper decline in eGFR, higher FP area with increased long-term proteinuria, and higher FP circularity with improvement in uACR during the first year. The association between lower SDL and eGFR decline remained a trend in IgAN patients not treated with corticosteroids (p = 0.068) but was absent in the treatment group (p = 0.59).
Conclusion: In this proof-of-concept study, nanoscale podocyte morphometrics demonstrated greater sensitivity than conventional EM in quantifying podocyte injury and predicting progression in IgAN. These findings suggest that high-resolution morphometrics may improve risk stratification in IgAN but require validation in larger, independent cohorts before clinical implementation.
Mazumder, A.; Pintea, S. D.; Chen, L.; Mazumder, A.; Kopp, J. B.
Chronic kidney disease of unknown etiology (CKDu) has emerged as an important public health challenge, particularly in agricultural communities across Southern Asia and Central America. Our research aims to explore the role of environmental factors in contributing to CKDu prevalence in these regions. Using an Extreme Gradient Boosting (XGBoost) machine learning model, we analyzed an environmental dataset from the CKDu-endemic region of Sri Lanka. The XGBoost model achieved 85% accuracy in predicting CKDu prevalence across a total of 100 locales. Significant predictor variables included fluoride concentration in water, electrical conductivity (EC) of drinking water, pH, and soil type. Fluoride, a common contaminant in drinking water, was the most influential factor, followed by EC and pH, which influence the solubility and bioavailability of nephrotoxic chemicals in water sources. The study findings highlight the urgent need for targeted water analysis programs and interventions in water quality management, agrochemical usage, and soil treatment in CKDu-endemic regions. These insights also provide a framework for future research to identify causative agents and develop strategies for reducing CKDu prevalence.
Qi, J.; Zeng, P.
Show abstract
Background: Renal impairment is associated with increased risk of Parkinson's disease (PD) in general populations; however, the renal-PD link remains unclear in patients with cardiovascular disease (CVD), despite the high comorbidity of renal dysfunction and elevated PD risk in this population. Objectives: To assess the association, longitudinal trajectories, and predictive value of renal function for PD specifically within a CVD cohort. Methods: Among 29,266 UK Biobank CVD patients, we assessed baseline renal function via creatinine-based (eGFRcr) and cystatin C-based (eGFRcys) estimated glomerular filtration rate. Multivariable Cox regression analyzed associations with incident PD and all-cause mortality, with extensive sensitivity analyses addressing reverse causation and confounding. A nested case-control analysis characterized pre-PD eGFR trajectories over 14 years. Finally, we evaluated whether renal function improved the PREDICT-PD model's predictive ability. Results: Over a median 13.1-year follow-up, 489 incident PD cases and 5,919 deaths occurred. Lower eGFR levels exhibited dose-dependent associations with increased PD risk (eGFRcr: HR=0.87 [0.80-0.95]; eGFRcys: HR=0.90 [0.82-0.99]) and all-cause mortality (eGFRcr: HR=0.77 [0.75-0.79]; eGFRcys: HR=0.64 [0.63-0.66]). Pre-PD eGFR trajectories diverged significantly from controls starting more than 14 years before diagnosis. eGFR-defined chronic kidney disease (<60 mL/min/1.73 m2) conferred 38-60% higher PD risk and 159-234% elevated mortality risk, and significantly enhanced PREDICT-PD's discrimination, with a 1.18-1.34% increase in prediction accuracy. Conclusions: Impaired renal function is an independent risk factor for PD and all-cause mortality in CVD patients, preceded by a slow, progressive eGFR decline starting >14 years before diagnosis. Incorporating renal function substantially improves PD risk prediction in this population.
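The creatinine-based eGFRcr used to define chronic kidney disease (<60 mL/min/1.73 m2) is conventionally computed with the CKD-EPI 2021 race-free equation; a minimal sketch (illustrative implementation, not the authors' code):

```python
def egfr_ckd_epi_2021(scr_mg_dl: float, age: float, female: bool) -> float:
    """Creatinine-based eGFR (mL/min/1.73 m2), CKD-EPI 2021 race-free equation."""
    kappa = 0.7 if female else 0.9    # sex-specific creatinine "knee"
    alpha = -0.241 if female else -0.302
    egfr = (142.0
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.200
            * 0.9938 ** age)
    return egfr * 1.012 if female else egfr
```

For example, a serum creatinine of 1.2 mg/dL in a 60-year-old man gives an eGFR of about 69 mL/min/1.73 m2, above the <60 threshold used to define CKD here, whereas 2.5 mg/dL in a 70-year-old man falls well below it.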
Muchinga, J.; Moonga, G.; Mukumbuta, N.; Musonda, P.
Show abstract
Background: Anemia is a condition commonly caused by nutritional deficiencies and blood disorders, predominantly affecting children aged 6 to 59 months and women of reproductive age, especially in low- and middle-income countries. In Zambia, anemia is a public health problem. This study aims to assess the spatial patterns of anemia and determine factors associated with its severity in Zambia over six years (2018 to 2024). Methods: The study included a total of 19,362 women of reproductive age (WRA) from two waves of the Zambia Demographic and Health Survey (ZDHS), 2018 and 2024. The ZDHS is a periodic national survey that uses multistage sampling. We adopted an analytical cross-sectional design, and a three-level multivariable ordinal logistic regression model was used to identify variables (individual, household, and community level) associated with anemia severity. Global Moran's I, Local Moran's I, and Getis-Ord Gi* statistics were used to determine hotspots and spatial patterns, while spatial scan statistics were used to detect primary and secondary clusters and their distribution over the two cycles. Results: The prevalence of anemia among women of reproductive age in Zambia was 31.0% (n=3,946) in 2018 and 30.4% (n=2,015) in 2024. The factors associated with higher odds of anemia severity were HIV status (HIV-positive: AOR=2.63, 95% CI: 2.25-3.09), pregnancy (AOR=1.96, 95% CI: 1.67-2.31), and rural residency (AOR=1.21, 95% CI: 1.08-1.35). Being in a union was protective compared to never being in a union (AOR=0.66, 95% CI: 0.57-0.77), and not having financial barriers to medical assistance was equally protective. Spatial analysis showed geographic disparities and a non-random distribution of anemia (Global Moran's I, 2018: I=0.147, p<0.001; 2024: I=0.130, p<0.001). Hotspot analysis depicted an expansion of high-risk areas from Western Province in 2018 to North-Western and Luapula Provinces in 2024.
Spatial scan analysis identified the south-west region (Western, Southern and North-Western) as the significant primary cluster of anemia consistently for both waves.
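Global Moran's I, the spatial autocorrelation statistic reported above, is straightforward to compute from a spatial weights matrix; a minimal sketch on a toy one-dimensional neighbourhood (not the survey's actual cluster geometry):

```python
def morans_i(values, weights):
    """Global Moran's I; weights[i][j] is the spatial weight between units i and j."""
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]                 # deviations from the mean
    s0 = sum(sum(row) for row in weights)          # total weight
    num = sum(weights[i][j] * z[i] * z[j] for i in range(n) for j in range(n))
    den = sum(zi * zi for zi in z)
    return (n / s0) * (num / den)

# Six locations on a line, adjacent neighbours weighted 1:
# high values cluster at one end, so I should be clearly positive.
vals = [1, 1, 1, 0, 0, 0]
w = [[1 if abs(i - j) == 1 else 0 for j in range(6)] for i in range(6)]
```

Values near +1 indicate clustering of similar values (as in the toy pattern above), while alternating values give a negative I; the reported I of 0.13-0.147 with p<0.001 thus reflects modest but statistically significant clustering.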
Santo Andre, H. C.; Roux, E. L.; De Jong, N. P.; Smith, P. R.; Lange, A. H.; Mendez, C.; Zahariev, A.; Mamele, M. L.; Johnson, G.; Pan, Z.; Simon, C.; Bessesen, D. H.; Pinto, A. J.; Bergouignan, A.
Show abstract
Objective: To investigate the effects of breaking up prolonged sedentary behavior (SB) on daily movement behavior and energy balance in adults with overweight/obesity. Methods: Thirty participants (16F/14M; age 34.2 ± 7.3 y; BMI 29.5 ± 3.2 kg/m2) were randomized to either BREAK (nine hourly 5-min brisk walking bouts) or a duration-matched intervention, ONE (one 45-min brisk walk), both performed 5 days/week for 6 weeks. Pre- and post-intervention, daily SB and physical activity (PA; accelerometry), body composition (doubly labeled water [DLW]), total daily energy expenditure (TDEE; DLW), appetite, and fasting leptin were measured. Linear mixed-effects models tested time effects and time-by-group interactions. Results: Only BREAK reduced prolonged SB (-8%; interaction: p=0.043). Both groups shifted SB-PA composition toward greater moderate-to-vigorous PA (MVPA) with proportional reductions in SB and light PA (time: all p<0.012), which were associated with increases in TDEE (+0.67 MJ/d; time: p=0.040). Body and fat mass increased in ONE only (interaction: p=0.061 and p=0.055). No differences were noted in energy intake, appetite, or leptin levels. Conclusions: Spreading short PA bouts throughout the day increases MVPA and TDEE to the same extent as a traditional continuous PA bout. Future studies should investigate whether minor differences in body composition are driven by distinct behavioral/physiological compensations influenced by the daily pattern of PA/SB.
Rocha, J. A.; Boer, P. A.; Folguieri, M. S.; Calsa, B.
Show abstract
Background: Maternal protein restriction results in a 28% reduction in nephrogenic cells and nephron units in rodent offspring by the 17th day of gestation compared to adequate protein intake. Aims: The present study investigates the association between growth factor expression and some developmental pathways that contribute to nephron reduction during embryonic and fetal development. Experimental Design: Pregnant C57BL/6-Tg and C57BL/6J mice were assigned to either normal protein intake (NP; 17%) or low protein intake (LP; 6%) groups. Body weight of male offspring and kidney growth factor expression were assessed on gestation days (GD) 14 and 18. Results: On GD 14, LP pups exhibited a 4% higher body mass (0.1035 g) compared to NP pups (0.0995 g, p = 0.005). By GD 18, LP pups demonstrated a 4% decrease in body mass (0.939 g, p = 0.03) and a 10% increase in the number of cells per metanephric cap area. Three genes (Csf2, Il1b, Il2) were downregulated, while seven genes (Bmp2, Csf3, Fgf8, Gdnf, Bmp7, Fgf3, Ntf3) were upregulated. By GD 14, phagophores and autophagosomes in the ureteric bud increased by 197%, with further increases observed by GD 18. Bcl-2 expression increased significantly in ureteric bud cells, and mTOR activity was elevated by GD 18. Conclusion: Early gestational protein restriction modifies renal growth factor gene expression, influencing cell proliferation and autophagy, and may contribute to reduced nephron numbers by the 18th day of gestation.
Highlights:
- This study examines the effects of a low-protein diet during pregnancy in mice and demonstrates a significant reduction in embryo-fetal body weight between gestational days 14 and 18.
- Protein restriction induces a distinct cellular pattern in the mesonephros, with a 21% increase in CAP cells at gestational day 14 (GD14), followed by a decrease by gestational day 18 (GD18) compared to offspring from mothers on a normal protein diet.
- Additionally, increased expression levels of key growth factors essential for kidney development were observed at GD 14, comparing LP with NP intake during pregnancy.
- Seven genes were upregulated (Gdnf, Bmp2, Bmp7, Tgf, Fgf8, Fgf3, Csf3, Ntf3), while three genes were downregulated (Csf2, Il1b, Il2).
- Overall, these findings indicate that gene regulation, autophagy, and mTOR signaling mechanisms significantly influence nephron numbers in response to gestational protein restriction beyond the 18th day of gestation.
Ahangaran, M.; Jia, S.; Chitalia, S.; Athavale, A.; Francis, J. M.; O'Donnell, M. W.; Bavi, S. R.; Gupta, U. D.; Kolachalama, V. B.
Show abstract
Background: Large Language Models (LLMs) have demonstrated strong performance in medical question-answering tasks, highlighting their potential for clinical decision support and medical education. However, their effectiveness in subspecialty areas such as nephrology remains underexplored. In this study, we assess the performance of open-source LLMs in answering multiple-choice questions from the Nephrology Self-Assessment Program (NephSAP) to better understand their capabilities and limitations within this specialized clinical domain. Methods: We evaluated the performance of five open-source large language models (LLMs): PodGPT, a podcast-pretrained model focused on STEMM disciplines; Llama 3.2-11B; Mistral-7B-Instruct-v0.2; Falcon3-10B-Instruct; and Gemma-2-9B-it. Each model was tested on its ability to answer multiple-choice questions derived from the NephSAP. Model performance was quantified using accuracy, defined as the proportion of correctly answered questions. In addition, the quality of the models' explanatory responses was assessed using several natural language processing (NLP) metrics: Bilingual Evaluation Understudy (BLEU), Word Error Rate (WER), cosine similarity, and Flesch-Kincaid Grade Level (FKGL). For qualitative analysis, three board-certified nephrologists reviewed 40 randomly selected model responses to identify factual and clinical reasoning errors, with performance summarized as average error ratios based on the proportion of error-associated words per response. Results: Among the evaluated models, PodGPT achieved the highest accuracy (64.77%), whereas Llama showed the lowest performance, with an accuracy of 45.08%. Qualitative analysis showed that PodGPT had the lowest factual error rate (0.017), while Llama and Falcon achieved the lowest reasoning error rates (0.038).
Conclusions: This study highlights the importance of STEMM-based training to enhance the reasoning capabilities and reliability of LLMs in clinical contexts, supporting the development of more effective AI-driven decision-support tools in nephrology and other medical specialties.
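Of the NLP metrics listed, Word Error Rate is the simplest to reproduce: the word-level Levenshtein distance divided by the reference length. A self-contained sketch over whitespace tokens (the study's exact tokenization is an assumption):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + insertions + deletions) / reference length,
    computed via Levenshtein distance over words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j]: edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution or match
    return dp[len(ref)][len(hyp)] / len(ref)
```

A hypothesis that swaps one word in a four-word reference, for instance, scores 1/4 = 0.25; lower WER against a reference explanation indicates closer agreement.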
Wang, Y.; Luo, Y.
Show abstract
Purpose: This study aimed to examine the effects of formative and summative assessments on college students' tennis performance and basic psychological needs. Methods: A total of 128 undergraduate students (64 males, 64 females; Mage = 19.22, SD = 0.91) participated in this study. Participants were cluster-randomized to either a formative assessment group (n = 64) or a summative assessment group (n = 64). The formative assessment intervention involved setting personalized learning goals and success criteria, administering periodic tests, and providing process-oriented and individualized feedback. The summative assessment intervention involved setting uniform goals for all students, offering instructor feedback only on common problems, and requiring students to practice independently after class without personalized guidance. Both interventions were implemented over 10 weeks, with one 90-minute session each week. Tennis skills and basic psychological needs (i.e., autonomy, competence, and relatedness) were assessed before and after the intervention. Tennis skills were reassessed 1 week after the intervention. Two-way mixed-effects analysis of variance (ANOVA) was used to examine the impact of group, time, and their interaction on tennis skills and basic psychological needs. Results: The results showed that the interaction between group and time was significant for all of the outcome variables. Simple effects analyses indicated that, at pre-test, the two groups did not differ significantly in tennis performance or in satisfaction of autonomy, competence, and relatedness (p > 0.05). At post-intervention, the formative assessment group demonstrated significantly better performance than the summative assessment group in tennis skills (MD = 3.50, 95% CI = [1.303, 5.697], p = 0.002), autonomy (MD = 2.44, 95% CI = [1.816, 3.059], p < 0.001), relatedness (MD = 1.33, 95% CI = [0.679, 1.977], p < 0.001), and competence (MD = 1.75, 95% CI = [1.046, 2.454], p < 0.001).
At the 1-week follow-up session, the formative assessment group also showed significantly better tennis performance than the summative assessment group (MD = 6.81, 95% CI = [4.667, 8.958], p < 0.001). Conclusion: Formative assessment was more effective than summative assessment in improving college students' tennis performance and satisfying their basic psychological needs. These findings suggest that incorporating personalized goals, process-oriented evaluation, and individualized feedback into tennis instruction could promote both skill development and psychological outcomes in college physical education.
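The group contrasts above are reported as mean differences with 95% CIs; the basic computation can be sketched as follows (normal approximation with a Welch-style standard error — with n = 64 per group the z and t critical values nearly coincide, but the authors' exact procedure is not stated):

```python
from statistics import mean, variance
from math import sqrt

def mean_diff_ci(group_a, group_b, z=1.96):
    """Mean difference between two independent groups with an approximate
    95% CI (Welch-style SE; normal critical value as an assumption)."""
    md = mean(group_a) - mean(group_b)
    se = sqrt(variance(group_a) / len(group_a) +
              variance(group_b) / len(group_b))
    return md, (md - z * se, md + z * se)

# Tiny illustrative samples, not the study's data:
md, (lo, hi) = mean_diff_ci([5, 6, 7, 8, 9], [1, 2, 3, 4, 5])
```

Here the toy samples give MD = 4.0 with CI roughly [2.04, 5.96]; a CI excluding zero corresponds to a significant group difference, as in the tennis-skill contrasts above.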